Extending Cobot's Motion Intention Visualization by Haptic Feedback
Nowadays, robots are found in a growing number of areas where they
collaborate closely with humans. Enabled by lightweight materials and safety
sensors, these cobots are gaining increasing popularity in domestic care,
supporting people with physical impairments in their everyday lives. However,
when cobots perform actions autonomously, it remains challenging for human
collaborators to understand and predict their behavior, which is crucial for
achieving trust and user acceptance. One significant aspect of predicting cobot
behavior is understanding their motion intention and comprehending how they
"think" about their actions. Moreover, other information sources often occupy
human visual and audio modalities, rendering them frequently unsuitable for
transmitting such information. We work on a solution that communicates cobot
intention via haptic feedback to tackle this challenge. In our concept, we map
planned motions of the cobot to different haptic patterns to extend the visual
intention feedback.
Comment: Final CHI LBW 2023 submission:
https://dx.doi.org/10.1145/3544549.358560
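The mapping described in the abstract, from planned cobot motions to distinct haptic patterns, might be sketched as follows. The direction labels, amplitudes, and timings here are illustrative assumptions, not parameters from the paper.

```python
# Hypothetical sketch: mapping a cobot's planned motion to a vibration
# pattern. Directions, amplitudes, and durations are invented for
# illustration only.

def motion_to_haptic_pattern(direction, speed):
    """Map a planned motion (direction label, speed in m/s) to a simple
    vibration pattern: (amplitude 0..1, pulse count, pulse length in ms)."""
    pulses_by_direction = {"left": 1, "right": 2, "up": 3, "down": 4}
    pulse_count = pulses_by_direction.get(direction, 1)
    amplitude = min(1.0, speed / 0.5)       # faster motion -> stronger vibration
    pulse_ms = 80 if speed > 0.25 else 150  # faster motion -> shorter pulses
    return amplitude, pulse_count, pulse_ms

print(motion_to_haptic_pattern("up", 0.4))  # -> (0.8, 3, 80)
```

A real system would derive such patterns from the robot's motion planner rather than from a fixed lookup table, but the principle of encoding direction and urgency in distinguishable pulse trains is the same.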
How to Communicate Robot Motion Intent: A Scoping Review
Robots are becoming increasingly omnipresent in our daily lives, supporting
us and carrying out autonomous tasks. In Human-Robot Interaction, human actors
benefit from understanding the robot's motion intent to avoid task failures and
foster collaboration. Finding effective ways to communicate this intent to
users has recently received increased research interest. However, no common
language has been established to systematize robot motion intent. This work
presents a scoping review aimed at unifying existing knowledge. Based on our
analysis, we present an intent communication model that depicts the
relationship between robot and human through different intent dimensions
(intent type, intent information, intent location). We discuss these different
intent dimensions and their interrelationships with different kinds of robots
and human roles. Throughout our analysis, we classify the existing research
literature along our intent communication model, allowing us to identify key
patterns and possible directions for future research.
Comment: Interactive Data Visualization of the Paper Corpus:
https://rmi.robot-research.d
Exploring AI-enhanced Shared Control for an Assistive Robotic Arm
Assistive technologies and in particular assistive robotic arms have the
potential to enable people with motor impairments to live a self-determined
life. More and more of these systems have become available for end users in
recent years, such as the Kinova Jaco robotic arm. However, they mostly require
complex manual control, which can overwhelm users. As a result, researchers
have explored ways to let such robots act autonomously. However, at least for
this specific group of users, such an approach has proven futile. Here,
users want to stay in control to achieve a higher level of personal autonomy,
to which an autonomous robot runs counter. In our research, we explore how
Artificial Intelligence (AI) can be integrated into a shared control paradigm.
In particular, we focus on the consequential requirements for the interface
between human and robot and how we can keep humans in the loop while still
significantly reducing the mental load and required motor skills.
Comment: Workshop on Engineering Interactive Systems Embedding AI Technologies
(EIS-embedding-AI) at EICS'2
In Time and Space: Towards Usable Adaptive Control for Assistive Robotic Arms
Robotic solutions, in particular robotic arms, are becoming more frequently
deployed for close collaboration with humans, for example in manufacturing or
domestic care environments. These robotic arms require the user to control
several Degrees-of-Freedom (DoFs) to perform tasks, primarily involving
grasping and manipulating objects. Standard input devices predominantly have
two DoFs, requiring time-consuming and cognitively demanding mode switches to
select individual DoFs. Contemporary Adaptive DoF Mapping Controls (ADMCs)
have been shown to decrease the necessary number of mode switches but have so
far been unable to significantly reduce the perceived workload. Users still bear the
mental workload of incorporating abstract mode switching into their workflow.
We address this by providing feed-forward multimodal feedback using updated
recommendations of ADMC, allowing users to visually compare the current and the
suggested mapping in real-time. We contrast the effectiveness of two new
approaches that a) continuously recommend updated DoF combinations or b) use
discrete thresholds between current robot movements and new recommendations.
Both are compared in a Virtual Reality (VR) in-person study against a classic
control method. Significant results for lowered task completion time, fewer
mode switches, and reduced perceived workload conclusively establish that in
combination with feed-forward, ADMC methods can indeed outperform classic mode
switching. A lack of apparent quantitative differences between Continuous and
Threshold reveals the importance of user-centered customization options.
Including these implications in the development process will improve usability,
which is essential for successfully implementing robotic technologies with high
user acceptance.
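The two feed-forward approaches contrasted above can be sketched as a small decision rule: the Continuous variant always surfaces the latest suggested DoF mapping, while the Threshold variant only does so once the suggestion diverges enough from the current robot movement. The similarity metric and threshold value below are assumptions for illustration, not the study's actual parameters.

```python
# Hedged sketch of the Continuous vs. Threshold recommendation strategies.
# Cosine similarity and the 0.9 threshold are illustrative assumptions.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def should_update(current_dir, suggested_dir, mode, threshold=0.9):
    """Continuous: always show the latest suggested DoF mapping.
    Threshold: only surface a new suggestion once it diverges enough
    from the current robot movement."""
    if mode == "continuous":
        return True
    return cosine_similarity(current_dir, suggested_dir) < threshold

print(should_update((1, 0, 0), (1, 0.1, 0), "threshold"))  # similar -> False
print(should_update((1, 0, 0), (0, 1, 0), "threshold"))    # orthogonal -> True
```

The study's finding that the two variants did not differ quantitatively suggests that where exactly such a threshold sits may matter less than letting users tune it themselves.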
AdaptiX -- A Transitional XR Framework for Development and Evaluation of Shared Control Applications in Assistive Robotics
With the ongoing efforts to empower people with mobility impairments and the
increase in technological acceptance by the general public, assistive
technologies, such as collaborative robotic arms, are gaining popularity. Yet,
their widespread success is limited by usability issues, specifically the
disparity between user input and software control along the autonomy continuum.
To address this, shared control concepts provide opportunities to combine the
targeted increase of user autonomy with a certain level of computer assistance.
This paper presents the free and open-source AdaptiX XR framework for
developing and evaluating shared control applications in a high-resolution
simulation environment. The initial framework consists of a simulated robotic
arm with an example scenario in Virtual Reality (VR), multiple standard control
interfaces, and a specialized recording/replay system. AdaptiX can easily be
extended for specific research needs, allowing Human-Robot Interaction (HRI)
researchers to rapidly design and test novel interaction methods, intervention
strategies, and multi-modal feedback techniques, without requiring an actual
physical robotic arm during the early phases of ideation, prototyping, and
evaluation. In addition, a Robot Operating System (ROS) integration enables
control of a real robotic arm in a PhysicalTwin approach without any
simulation-reality gap. Here, we review the capabilities and limitations of
AdaptiX in detail and present three bodies of research based on the framework.
AdaptiX can be accessed at https://adaptix.robot-research.de.
Comment: Accepted submission at The 16th ACM SIGCHI Symposium on Engineering
Interactive Computing Systems (EICS'24
HaptiX: Vibrotactile Haptic Feedback for Communication of 3D Directional Cues
In Human-Computer-Interaction, vibrotactile haptic feedback offers the
advantage of being independent of any visual perception of the environment.
Most importantly, the user's field of view is not obscured by user interface
elements, and the visual sense is not unnecessarily strained. This is
especially advantageous when the visual channel is already busy, or the visual
sense is limited. We developed three design variants based on different
vibrotactile illusions to communicate 3D directional cues. In particular, we
explored two variants based on the vibrotactile illusion of the cutaneous
rabbit and one based on apparent vibrotactile motion. To communicate gradient
information, we combined these with pulse-based and intensity-based mapping. A
subsequent study showed that the pulse-based variants based on the vibrotactile
illusion of the cutaneous rabbit are suitable for communicating both
directional and gradient characteristics. The results further show that a
representation of 3D directions via vibrations can be effective and beneficial.
Comment: CHI EA '23, April 23-28, 2023, Hamburg, Germany
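The cutaneous rabbit illusion mentioned above arises from brief pulses delivered in rapid succession across spatially separated actuators, which are perceived as taps "hopping" along the skin toward the last actuator. A pulse schedule for such a sequence might be generated as follows; the actuator layout and timings are assumptions for illustration, not the paper's parameters.

```python
# Illustrative sketch of a cutaneous-rabbit style pulse schedule.
# Actuator names, pulse counts, and timings are invented for illustration.

def rabbit_schedule(actuators, pulses_per_actuator=3, pulse_ms=40, gap_ms=60):
    """Return (actuator, start_ms) pairs for a sequential pulse train that
    sweeps across the actuators, creating the illusion of motion."""
    schedule = []
    t = 0
    for actuator in actuators:
        for _ in range(pulses_per_actuator):
            schedule.append((actuator, t))
            t += pulse_ms + gap_ms
    return schedule

# Three actuators along the forearm, sweeping toward the wrist:
print(rabbit_schedule(["elbow", "mid", "wrist"], pulses_per_actuator=2))
```

Gradient information, as in the paper's pulse-based mapping, could then be layered on top by varying the number of pulses per actuator.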
Pan-Cancer Analysis of lncRNA Regulation Supports Their Targeting of Cancer Genes in Each Tumor Context
Long noncoding RNAs (lncRNAs) are commonly dysregulated in tumors, but only a handful are known to play pathophysiological roles in cancer. We inferred lncRNAs that dysregulate cancer pathways, oncogenes, and tumor suppressors (cancer genes) by modeling their effects on the activity of transcription factors, RNA-binding proteins, and microRNAs in 5,185 TCGA tumors and 1,019 ENCODE assays. Our predictions included hundreds of candidate onco- and tumor-suppressor lncRNAs (cancer lncRNAs) whose somatic alterations account for the dysregulation of dozens of cancer genes and pathways in each of 14 tumor contexts. To demonstrate proof of concept, we showed that perturbations targeting OIP5-AS1 (an inferred tumor suppressor) and TUG1 and WT1-AS (inferred onco-lncRNAs) dysregulated cancer genes and altered proliferation of breast and gynecologic cancer cells. Our analysis indicates that, although most lncRNAs are dysregulated in a tumor-specific manner, some, including OIP5-AS1, TUG1, NEAT1, MEG3, and TSIX, synergistically dysregulate cancer pathways in multiple tumor contexts.
Pan-cancer Alterations of the MYC Oncogene and Its Proximal Network across the Cancer Genome Atlas
Although the MYC oncogene has been implicated in cancer, a systematic assessment of alterations of MYC, related transcription factors, and co-regulatory proteins, forming the proximal MYC network (PMN), across human cancers is lacking. Using computational approaches, we define genomic and proteomic features associated with MYC and the PMN across the 33 cancers of The Cancer Genome Atlas. Pan-cancer, 28% of all samples had at least one of the MYC paralogs amplified. In contrast, the MYC antagonists MGA and MNT were the most frequently mutated or deleted members, proposing a role as tumor suppressors. MYC alterations were mutually exclusive with PIK3CA, PTEN, APC, or BRAF alterations, suggesting that MYC is a distinct oncogenic driver. Expression analysis revealed MYC-associated pathways in tumor subtypes, such as immune response and growth factor signaling; chromatin, translation, and DNA replication/repair were conserved pan-cancer. This analysis reveals insights into MYC biology and is a reference for biomarkers and therapeutics for cancers with alterations of MYC or the PMN.
Genomic, Pathway Network, and Immunologic Features Distinguishing Squamous Carcinomas
This integrated, multiplatform PanCancer Atlas study co-mapped and identified distinguishing
molecular features of squamous cell carcinomas (SCCs) from five sites associated with smoking.
- …